101 research outputs found

    Eigenvalue spectral properties of sparse random matrices obeying Dale's law

    Understanding the dynamics of large networks of neurons with heterogeneous connectivity architectures is a complex physics problem that demands novel mathematical techniques. Biological neural networks are inherently spatially heterogeneous, making them difficult to model mathematically. Random recurrent neural networks capture complex connectivity structures while remaining mathematically tractable. Our paper generalises previous classical results to sparse connectivity matrices with distinct excitatory (E) and inhibitory (I) neural populations. By investigating sparse networks, our analysis examines the impact of all levels of network sparseness, and we discover a novel nonlinear interaction between the connectivity matrix and the resulting network dynamics, in both the balanced and unbalanced cases. Specifically, we deduce new mathematical dependencies describing the influence of sparsity and distinct E/I distributions on the distribution of eigenvalues (eigenspectrum) of the networked Jacobian. Furthermore, we illustrate that the previous classical results are special cases of the more general results described here. Understanding the impact of sparse connectivity on network dynamics is of particular importance for both theoretical neuroscience and mathematical physics, as it pertains to the structure-function relationship of networked systems and their dynamics. Our results are an important step towards developing analysis techniques that are essential to studying the impact of larger-scale network connectivity on network function, and to furthering our understanding of brain function and dysfunction. Comment: 18 pages, 6 figures
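
    The construction the abstract describes can be sketched numerically: a sparse random connectivity matrix whose columns (presynaptic neurons) are all-excitatory or all-inhibitory, as Dale's law requires, whose eigenspectrum can then be inspected directly. This is a minimal illustration with assumed parameters (N, excitatory fraction, connection probability, and the 1/sqrt(Np) scaling), not the paper's actual derivation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed parameters: network size, excitatory fraction, connection probability
    N, f_exc, p = 500, 0.8, 0.1
    n_exc = int(f_exc * N)

    # Dale's law: each column (one presynaptic neuron) is entirely E or entirely I
    signs = np.ones(N)
    signs[n_exc:] = -1.0

    # Sparse connectivity: Bernoulli(p) mask times positive magnitudes,
    # scaled so the spectral radius stays O(1) as N grows
    mask = rng.random((N, N)) < p
    magnitudes = np.abs(rng.normal(0.0, 1.0, (N, N)))
    J = mask * magnitudes * signs[np.newaxis, :] / np.sqrt(N * p)

    eigvals = np.linalg.eigvals(J)
    spectral_radius = np.abs(eigvals).max()
    ```

    Varying `p` from dense (p = 1) down to very sparse values is one way to observe how sparseness reshapes the eigenvalue distribution relative to the classical dense-matrix results.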

    Spike history model for neural control


    Brazing techniques for the fabrication of biocompatible carbon-based electronic devices

    Prototype electronic devices have been critical to the discovery and demonstration of the unique properties of new materials, including composites based on carbon nanotubes (CNT) and graphene. However, these devices are not typically constructed with durability or biocompatibility in mind, relying on conductive polymeric adhesives, mechanical clamps or crimps, or solders for electrical connections. In this paper, two key metallization techniques are presented that employ commercially available brazing alloys to fabricate electronic devices based on diamond and carbonaceous wires. Investigation of the carbon-alloy interfacial interactions was utilized to guide device fabrication. The interplay of both chemical (adhesive) and mechanical (cohesive) forces at the interface of different forms of carbon was exploited to fabricate either freestanding or substrate-fixed carbonaceous electronic devices. Elemental analysis in conjunction with scanning electron microscopy of the carbon-alloy interface revealed the chemical nature of the Ag alloy bond and the mechanical nature of the Au alloy bond. Electrical characterization revealed the non-rectifying nature of the carbon-Au alloy interconnects. Finally, electronic devices were fabricated, including a Au circuit structure embedded in a polycrystalline diamond substrate.

    Homeostatic Scaling of Excitability in Recurrent Neural Networks

    Neurons adjust their intrinsic excitability when experiencing a persistent change in synaptic drive. This process can prevent neural activity from moving into either a quiescent state or a saturated state in the face of ongoing plasticity, and is thought to promote stability of the network in which neurons reside. However, most neurons are embedded in recurrent networks, which require a delicate balance between excitation and inhibition to maintain network stability. This balance could be disrupted when neurons independently adjust their intrinsic excitability. Here, we study the functioning of activity-dependent homeostatic scaling of intrinsic excitability (HSE) in a recurrent neural network. Using both simulations of a recurrent network consisting of excitatory and inhibitory neurons that implement HSE, and a mean-field description of adapting excitatory and inhibitory populations, we show that the stability of such adapting networks critically depends on the relationship between the adaptation time scales of both neuron populations. In a stable adapting network, HSE can keep all neurons functioning within their dynamic range, while the network is undergoing several (patho)physiologically relevant types of plasticity, such as persistent changes in external drive, changes in connection strengths, or the loss of inhibitory cells from the network. However, HSE cannot prevent the unstable network dynamics that result when, due to such plasticity, recurrent excitation in the network becomes too strong compared to feedback inhibition. This suggests that keeping a neural network in a stable and functional state requires the coordination of distinct homeostatic mechanisms that operate not only by adjusting neural excitability, but also by controlling network connectivity.
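
    The core idea of HSE, a neuron multiplicatively scaling its own gain until its firing rate matches a target, can be sketched for a single threshold-linear unit. The transfer function, target rate, and time constant below are illustrative assumptions, not the paper's model.

    ```python
    # Minimal single-unit sketch of homeostatic scaling of excitability (HSE):
    # the unit scales its gain g so that its firing rate drifts toward a target.
    target_rate = 5.0
    tau_hse = 50.0   # adaptation time constant (arbitrary units)
    dt = 0.1

    def step(g, drive):
        rate = g * max(drive, 0.0)                      # threshold-linear transfer
        g += dt / tau_hse * (target_rate - rate) * g    # multiplicative homeostasis
        return g, rate

    # A persistent change in drive is compensated by gain adaptation
    g, drive = 1.0, 2.0
    for _ in range(20000):
        g, rate = step(g, drive)
    ```

    In a recurrent network, each unit's `drive` depends on the rates of every other adapting unit, which is where the paper's stability condition on the E and I adaptation time scales comes in.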

    Soft-bound synaptic plasticity increases storage capacity

    Accurate models of synaptic plasticity are essential to understand the adaptive properties of the nervous system and for realistic models of learning and memory. Experiments have shown that synaptic plasticity depends not only on pre- and post-synaptic activity patterns, but also on the strength of the connection itself: weaker synapses are more easily strengthened than already strong ones. This so-called soft-bound plasticity automatically constrains the synaptic strengths. It is known that this has important consequences for the dynamics of plasticity and the synaptic weight distribution, but its impact on information storage is unknown. In this modeling study we introduce an information-theoretic framework to analyse memory storage in an online learning setting. We show that soft-bound plasticity increases a variety of performance criteria by about 18% over hard-bound plasticity, and likely maximizes the storage capacity of synapses.
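
    The contrast between the two bounding schemes can be sketched in a few lines: a hard-bound synapse takes fixed-size steps clipped at its limits, while a soft-bound synapse takes weight-dependent steps so that the bounds are only approached asymptotically. The learning rate and bound below are illustrative assumptions.

    ```python
    import numpy as np

    w_max = 1.0   # assumed upper bound on synaptic strength
    eta = 0.1     # assumed learning rate

    def hard_bound_update(w, potentiate):
        """Fixed-size step, clipped at the bounds [0, w_max]."""
        step = eta if potentiate else -eta
        return float(np.clip(w + step, 0.0, w_max))

    def soft_bound_update(w, potentiate):
        """Weight-dependent step: weak synapses potentiate easily, strong
        synapses depress easily, so the bounds are approached smoothly."""
        if potentiate:
            return w + eta * (w_max - w)
        return w - eta * w

    # Under repeated potentiation the hard-bound weight pins to w_max,
    # while the soft-bound weight saturates only asymptotically.
    w_hard = w_soft = 0.5
    for _ in range(100):
        w_hard = hard_bound_update(w_hard, True)
        w_soft = soft_bound_update(w_soft, True)
    ```

    Because soft-bound weights never pin exactly to the boundary, a population of such synapses keeps a graded weight distribution, which is one intuition for the improved storage capacity reported in the abstract.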

    STDP Allows Fast Rate-Modulated Coding with Poisson-Like Spike Trains

    Spike timing-dependent plasticity (STDP) has been shown to enable single neurons to detect repeatedly presented spatiotemporal spike patterns. This holds even when such patterns are embedded in equally dense random spiking activity, that is, in the absence of external reference times such as a stimulus onset. Here we demonstrate, both analytically and numerically, that STDP can also learn repeating rate-modulated patterns, which have received more experimental evidence, for example, through post-stimulus time histograms (PSTHs). Each input spike train is generated from a rate function using a stochastic sampling mechanism, chosen to be an inhomogeneous Poisson process here. Learning is feasible provided significant covarying rate modulations occur within the typical timescale of STDP (∼10–20 ms) for sufficiently many inputs (∼100 among 1000 in our simulations), a condition that is met by many experimental PSTHs. Repeated pattern presentations induce spike-time correlations that are captured by STDP. Despite imprecise input spike times and even variable spike counts, a single trained neuron robustly detects the pattern just a few milliseconds after its presentation. Therefore, temporal imprecision and Poisson-like firing variability are not an obstacle to fast temporal coding. STDP provides an appealing mechanism to learn such rate patterns, which, beyond sensory processing, may also be involved in many cognitive tasks.
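
    The input model named in the abstract, spike trains drawn from an inhomogeneous Poisson process, can be sampled by thinning: generate candidates from a homogeneous process at the peak rate, then keep each with probability rate(t)/rate_max. The example rate function (a ~15 ms Gaussian bump on a baseline, loosely PSTH-like) and all parameters are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def inhomogeneous_poisson(rate_fn, t_max, rate_max):
        """Sample spike times on [0, t_max) from an inhomogeneous Poisson
        process with intensity rate_fn(t) <= rate_max, via thinning."""
        # Candidate spikes from a homogeneous process at rate_max
        n_candidates = rng.poisson(rate_max * t_max)
        candidates = np.sort(rng.uniform(0.0, t_max, n_candidates))
        # Keep each candidate with probability rate(t) / rate_max
        keep = rng.random(n_candidates) < rate_fn(candidates) / rate_max
        return candidates[keep]

    # Example: a rate bump of ~15 ms width on a 20 Hz baseline (times in seconds)
    rate = lambda t: 20.0 + 60.0 * np.exp(-((t - 0.05) ** 2) / (2 * 0.015 ** 2))
    spikes = inhomogeneous_poisson(rate, t_max=0.1, rate_max=80.0)
    ```

    Drawing one such train per input channel, with covarying rate bumps across ~100 of the inputs, reproduces the kind of spike-time correlations that the abstract says STDP captures across repeated presentations.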

    Growth Rules for the Repair of Asynchronous Irregular Neuronal Networks after Peripheral Lesions

    © 2021 Sinha et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/). Several homeostatic mechanisms enable the brain to maintain desired levels of neuronal activity. One of these, homeostatic structural plasticity, has been reported to restore activity in networks disrupted by peripheral lesions by altering their neuronal connectivity. While multiple lesion experiments have studied the changes in neurite morphology that underlie modifications of synapses in these networks, the underlying mechanisms that drive these changes are yet to be explained. Evidence suggests that neuronal activity modulates neurite morphology and may stimulate neurites to selectively sprout or retract to restore network activity levels. We developed a new spiking network model of peripheral lesioning that accurately reproduces the characteristics of network repair after deafferentation reported in experiments, in order to study the activity-dependent growth regimes of neurites. To ensure that our simulations closely resemble the behaviour of networks in the brain, we model deafferentation in a biologically realistic balanced network model that exhibits low-frequency Asynchronous Irregular (AI) activity as observed in cerebral cortex. Our simulation results indicate that the re-establishment of activity in neurons both within and outside the deprived region, the Lesion Projection Zone (LPZ), requires opposite activity-dependent growth rules for excitatory and inhibitory post-synaptic elements. Analysis of these growth regimes indicates that they also contribute to the maintenance of activity levels in individual neurons. Furthermore, in our model, the directional formation of synapses that is observed in experiments requires that pre-synaptic excitatory and inhibitory elements also follow opposite growth rules. Lastly, we observe that our proposed structural plasticity growth rules and the inhibitory synaptic plasticity mechanism that also balances our AI network both contribute to the restoration of the network to pre-deafferentation stable activity levels. Peer reviewed
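
    The "opposite growth rules" for excitatory and inhibitory post-synaptic elements can be sketched in their simplest linear form: excitatory elements sprout when activity falls below a set point and retract above it, while inhibitory elements do the reverse. The linear shape, set point, and rate constant are illustrative assumptions, not the paper's fitted growth curves.

    ```python
    # Minimal sketch of opposite activity-dependent growth rules
    # (positive return value = sprout new synaptic elements, negative = retract).

    def growth_exc(activity, set_point=5.0, nu=1.0):
        """Excitatory post-synaptic elements: sprout when the neuron is deprived."""
        return nu * (set_point - activity)

    def growth_inh(activity, set_point=5.0, nu=1.0):
        """Inhibitory post-synaptic elements: sprout when the neuron is hyperactive."""
        return nu * (activity - set_point)
    ```

    Under these rules, a deafferented neuron inside the LPZ (activity below the set point) gains excitatory and loses inhibitory input, pushing its activity back toward the set point, which matches the homeostatic repair behaviour the abstract describes.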